Editorial: High-performance tensor computations in scientific computing and data science
Editorial article, Front. Appl. Math. Stat., 23 September 2022, Sec. Mathematics of Computation and Data Science. https://doi.org/10.3389/fams.2022.1038885

Similar resources
Big Data in High Performance Scientific Computing
Computation in science and engineering can be regarded as a focal point around which revolves the need for knowledge of computer architecture, operating systems, network technology, programming languages, numerical algorithms, mathematical modeling, physical phenomena, and the theory of computation. If we discuss the cluster or the computing center, and not the supercomput...
Scientific and high-performance computing at FAIR
Future FAIR experiments have to deal with very high input rates and large track multiplicities, and must perform full event reconstruction and selection on-line on a large dedicated computer farm equipped with heterogeneous many-core CPU/GPU compute nodes. Developing efficient and fast algorithms optimized for parallel computation is a challenge for the groups of experts dealing with the HPC comp...
Hybrid programming in high performance scientific computing
An application programmer interface (API) is developed to facilitate, via OpenMP, the parallelization of the double-precision general matrix multiply routine called from within GAMESS [1] during the execution of the coupled-cluster module for calculating physical properties of molecules. Results are reported using the ATLAS library and the Intel MKL on an Intel machine, and using the ESSL and...
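As a minimal sketch of the idea only, not the GAMESS interface itself, an OpenMP-parallelized double-precision matrix multiply in C might look like the following; in practice such a routine would delegate to an optimized dgemm from ATLAS, Intel MKL, or ESSL rather than a hand-written loop.

```c
/* Sketch (assumed example, not the GAMESS API): C = A * B for
 * n-by-n double-precision matrices stored in row-major order,
 * with the outer loop split across OpenMP threads. */
void matmul_omp(int n, const double *A, const double *B, double *C)
{
    /* Each thread owns a disjoint block of rows of C, so no
     * synchronization is needed when writing results. */
    #pragma omp parallel for schedule(static)
    for (int i = 0; i < n; ++i) {
        for (int j = 0; j < n; ++j) {
            double sum = 0.0;
            for (int k = 0; k < n; ++k)
                sum += A[i * n + k] * B[k * n + j];
            C[i * n + j] = sum;
        }
    }
}
```

Built with an OpenMP-enabled compiler flag (for example -fopenmp in GCC/Clang), the pragma distributes iterations across threads; without OpenMP support the pragma is ignored and the same source compiles serially.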
Matrix Computations & Scientific Computing Seminar
Extracting useful information from high-dimensional data is the focus of today's statistical research and practice. After the broad success of statistical machine learning on prediction through regularization, interpretability is gaining attention, and sparsity has been used as its proxy. With the virtues of both regularization and sparsity, the Lasso (L1-penalized L2 minimization) and its extensions ha...
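In its usual formulation (scaling conventions vary across references), the L1-penalized L2 minimization mentioned above reads, for a response y, design matrix X, and penalty weight lambda:

```latex
% Lasso: squared-error loss plus an L1 penalty on the coefficients
\hat{\beta} \;=\; \arg\min_{\beta \in \mathbb{R}^{p}}
\; \tfrac{1}{2n}\,\lVert y - X\beta \rVert_2^2
\;+\; \lambda\,\lVert \beta \rVert_1
```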
Matrix Computations and Scientific Computing Seminar
The L1-regularized Gaussian maximum likelihood estimator has been shown to have strong statistical guarantees in recovering a sparse inverse covariance matrix even under high-dimensional settings. However, it requires solving a difficult non-smooth log-determinant program with a number of parameters that scales quadratically with the number of Gaussian variables. Earlier methods thus do not scale ...
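The non-smooth log-determinant program referred to here is commonly written as follows (notation assumed: S is the sample covariance and Theta the p-by-p inverse covariance estimate, which is why the parameter count grows quadratically in the number of variables p):

```latex
% L1-regularized Gaussian maximum likelihood (graphical lasso)
\hat{\Theta} \;=\; \arg\min_{\Theta \succ 0}
\; \operatorname{tr}(S\Theta) \;-\; \log\det\Theta
\;+\; \lambda\,\lVert \Theta \rVert_1
```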
Journal
Journal title: Frontiers in Applied Mathematics and Statistics
Year: 2022
ISSN: 2297-4687
DOI: https://doi.org/10.3389/fams.2022.1038885